Facebook has announced plans to rank the visibility of news sources based on whether they are “broadly trusted.” The mainstream media is reporting this as a win for users and as an abdication of responsibility on Facebook’s part. This couldn’t be further from the truth.

If Facebook really wanted to let users determine which news sources they trust, it already has a perfectly functioning system for doing so. It’s called “likes.” If the platform were truly giving power back to its users, it would let them see news from the sources they choose, via the existing Like system.

Instead, Facebook is going to rank and de-rank news sites based on complicated user surveys aimed at discovering “broadly trusted” sources. Here’s how Mark Zuckerberg explained it:

We decided that having the community determine which sources are broadly trusted would be most objective.

Here’s how this will work. As part of our ongoing quality surveys, we will now ask people whether they’re familiar with a news source and, if so, whether they trust that source. The idea is that some news organizations are only trusted by their readers or watchers, and others are broadly trusted across society even by those who don’t follow them directly. (We eliminate from the sample those who aren’t familiar with a source, so the output is a ratio of those who trust the source to those who are familiar with it.)
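Stripped of the survey mechanics, the score Zuckerberg describes reduces to a simple ratio: among respondents familiar with a source, the fraction who say they trust it. Here is a minimal sketch of that calculation in Python, using a hypothetical survey format (Facebook has not published its actual schema or any weighting):

    def broad_trust_score(responses):
        # responses: list of (is_familiar, trusts) boolean pairs -- a
        # hypothetical format; Facebook's real survey schema is not public.
        # Per Zuckerberg, respondents unfamiliar with a source are dropped,
        # so the score is (those who trust it) / (those familiar with it).
        familiar = [trusts for is_familiar, trusts in responses if is_familiar]
        if not familiar:
            return None  # nobody sampled knows the source, so no score
        return sum(familiar) / len(familiar)

    # Example: 6 of 10 respondents know the source; 3 of those 6 trust it.
    survey = [(True, True)] * 3 + [(True, False)] * 3 + [(False, False)] * 4
    print(broad_trust_score(survey))  # 0.5

Note that, as described, the denominator shrinks with familiarity: a niche outlet trusted mainly by its own small audience could in principle score as well as a household name. The announcement does not say whether familiarity itself also factors into the ranking.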

Of course, if Zuckerberg really wanted to let “the community determine” which news sources it reads, the “like” system already serves as a daily, multi-billion-user survey. What Facebook is subtly telling users is that it doesn’t care what they like.

In his announcement, Zuckerberg was transparent about his objectives. He wants to socially engineer his users away from “polarization” and “misinformation.”

There’s too much sensationalism, misinformation and polarization in the world today. Social media enables people to spread information faster than ever before, and if we don’t specifically tackle these problems, then we end up amplifying them. That’s why it’s important that News Feed promotes high quality news that helps build a sense of common ground.

Zuckerberg’s comment is a veiled attack on his own users. He believes that, left to their own devices, they gravitate toward “misinformation” and “polarization,” and need to be fed “high quality news” by a Facebook algorithm instead of being free to choose the sources they like.

Facebook’s search for media sources that are not “polarized” will be difficult in a media ecosystem split between the mainstream media on one hand, which continues, with decreasing effectiveness, to claim a lack of bias, and the alternative media on the other, which is open about its biases.

In Righteous Indignation, Andrew Breitbart praised the Huffington Post for being “openly and loudly and radically leftist,” in contrast to publications like The New York Times that conceal their biases and aspire to the same kind of unbiased, “broadly trusted” status that Facebook is planning to promote.

Unless so-called “unbiased” media organizations carefully control for viewpoint diversity — and none of them are particularly strict about it — they will inevitably fall into partisan groupthink. That’s precisely what has happened to the mainstream media.

To be biased is to be human. In searching for “broadly trusted” news sources, with all that phrase’s connotations of impartiality, Mark Zuckerberg is looking for… non-humans.

It’s almost like he wants the memes to keep coming.

Facebook admits social media threat to democracy

BY ROB LEVER (AFP)    

Facebook acknowledged Monday that the explosion of social media poses a potential threat to democracy, pledging to tackle the problem head-on and turn its powerful platform into a force for "good."

The comments from the world's biggest social network were its latest response to intense criticism for failing to stop the spread of misinformation among its two billion users -- most strikingly leading up to the 2016 US election.

In a blog post, Facebook civic engagement chief Samidh Chakrabarti said he was "not blind to the damage that the internet can do to even a well-functioning democracy."

"In 2016, we at Facebook were far too slow to recognize how bad actors were abusing our platform," he said. "We're working diligently to neutralize these risks now."

The post -- one in a series dubbed "hard questions" -- was part of a high-profile push by Facebook to reboot its image, including with the announcement last week that it would let users "rank" the trustworthiness of news sources to help stem the flow of false news.

"We're as determined as ever to fight the negative influences and ensure that our platform is unquestionably a source for democratic good," said Katie Harbath, Facebook's head of global politics and government outreach, in an accompanying statement.

Facebook, along with Google and Twitter, faces global scrutiny for facilitating the spread of bogus news -- some of it directed by Russia -- ahead of the US election, the Brexit vote and other electoral battles.

The social network has concluded that Russian actors created 80,000 posts that reached around 126 million people in the United States over a two-year period.

"It's abhorrent to us that a nation-state used our platform to wage a cyberwar intended to divide society," Chakrabarti said.

"This was a new kind of threat that we couldn't easily predict, but we should have done better. Now we're making up for lost time," he said.

Chakrabarti pointed at Facebook's pledge last year to identify the backers of political advertisements -- while also stressing the need to tread carefully, citing the example of rights activists who could be endangered if they are publicly identified on social media.

He also elaborated on the decision to let Facebook's users rank the "trustworthiness" of news sources, saying: "We don't want to be the arbiters of truth, nor do we imagine this is a role the world would want for us."

While acknowledging concerns over the rise of "echo chambers," he argued that "the best deterrent will ultimately be a discerning public."

- 'What could possibly go wrong?' -

Photo: News Corp. founder Rupert Murdoch says Facebook can attack the "fake news" problem better by paying "trusted" news organizations (Dia Dipasupil, GETTY IMAGES NORTH AMERICA/AFP/File)

Facebook's plan to rank news organizations based on user "trust" surveys has drawn a mixed response.

Renee DiResta of the nonprofit group Data for Democracy was optimistic.

"This is great news and a long time coming. Google has been ranking for quality for a long time, it's a bit baffling how long it took for social networks to get there," she wrote on Twitter.

But technology columnist Shelly Palmer warned that Facebook appeared to be equating trust and truth with what the public believes -- what some call "wikiality."

"Wikiality is Facebook's answer to fake news, alternative facts, and truthiness," Palmer wrote. "Facebook, the social media giant, is going to let you rank the news you think is most valuable. What could possibly go wrong?"

For media writer Matthew Ingram, the changes "not only won't fix the problem of 'fake news,' but could actually make it worse instead of better."

"Why? Because misinformation is almost always more interesting than the truth," he wrote in the Columbia Journalism Review.

News Corp. founder and executive chairman Rupert Murdoch also expressed skepticism, suggesting Facebook should instead pay "carriage fees" to trusted news organizations, following the example of cable TV operators.

"I have no doubt that Mark Zuckerberg is a sincere person, but there is still a serious lack of transparency that should concern publishers and those wary of political bias at these powerful platforms," Murdoch said in a statement issued by his group, which publishes the Wall Street Journal and newspapers in Britain and Australia.
